Riemannian Newton and conjugate gradient algorithm for computing Lagrangian invariant subspaces

Author

  • M. Kleinsteuber
Abstract

The computation of Lagrangian invariant subspaces of a Hamiltonian matrix, or the closely related task of solving algebraic Riccati equations, is an important issue in linear optimal control, stochastic control and H∞-design. We propose a new class of Riemannian Newton methods for computing isolated Lagrangian invariant subspaces of a Hamiltonian matrix. The algorithm implements a variant of the Newton method for a quadratic vector field on the Lagrange Graßmann manifold, and it yields new methods for solving the algebraic Riccati equation in linear optimal control. In addition, an intrinsic conjugate gradient algorithm on the Lagrangian Grassmannian is introduced.
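The paper's intrinsic Riemannian iteration is not reproduced in this abstract, but the classical Euclidean counterpart for the continuous algebraic Riccati equation, the Newton-Kleinman iteration, illustrates the quadratic problem being attacked: each Newton step reduces to a Lyapunov solve. A minimal Python/SciPy sketch; the matrices A, B, Q, R and the stabilizing initial guess X0 = 0 are illustrative assumptions, not data from the paper.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, solve_continuous_are

# Newton-Kleinman iteration for the CARE  A^T X + X A - X B R^{-1} B^T X + Q = 0.
# Linearizing the quadratic residual at X_k turns each step into a Lyapunov solve.
def newton_kleinman(A, B, Q, R, X0, tol=1e-10, max_iter=50):
    X = X0
    for _ in range(max_iter):
        K = np.linalg.solve(R, B.T @ X)     # feedback gain K = R^{-1} B^T X
        Ak = A - B @ K                      # closed-loop matrix
        # Solve  Ak^T X_new + X_new Ak = -(Q + K^T R K)
        X_new = solve_continuous_lyapunov(Ak.T, -(Q + K.T @ R @ K))
        if np.linalg.norm(X_new - X) <= tol * max(1.0, np.linalg.norm(X)):
            return X_new
        X = X_new
    return X

# Assumed test data: A is Hurwitz, so X0 = 0 is a stabilizing initial guess.
A = np.array([[-1.0, 2.0], [0.0, -3.0]])
B = np.array([[0.0], [1.0]])
Q = np.eye(2)
R = np.eye(1)
X = newton_kleinman(A, B, Q, R, X0=np.zeros((2, 2)))
print(np.allclose(X, solve_continuous_are(A, B, Q, R)))  # cross-check: True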


Similar works

On a selective reuse of Krylov subspaces in Newton-Krylov approaches for nonlinear elasticity

1. Introduction. We consider the resolution of large-scale nonlinear problems arising from the finite-element discretization of geometrically non-linear structural analysis problems. We use a classical Newton-Raphson algorithm to handle the non-linearity, which leads to the resolution of a sequence of linear systems with non-invariant matrices and right-hand sides. The linear systems are solved ...
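The Newton-Krylov pairing described above, an outer Newton-Raphson loop whose inner linear systems are handled by a Krylov solver, can be sketched in a few lines. A minimal illustration with SciPy's GMRES on a toy two-variable system; the residual F and its Jacobian are assumed for illustration, and the paper's selective reuse of Krylov subspaces across Newton steps is not reproduced here.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

# Toy nonlinear residual F(x) = 0 (assumed, not from the paper).
def F(x):
    return np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])

def J(x):
    # analytic Jacobian of F; a matrix-free approximation would also do
    return np.array([[2.0 * x[0], 1.0], [1.0, 2.0 * x[1]]])

def newton_krylov(x, tol=1e-10, max_iter=20):
    for _ in range(max_iter):
        r = F(x)
        if np.linalg.norm(r) < tol:
            break
        # Each Newton step solves J(x) dx = -F(x) with GMRES; the Krylov
        # solver only ever needs Jacobian-vector products.
        Jop = LinearOperator((2, 2), matvec=lambda v: J(x) @ v)
        dx, info = gmres(Jop, -r, atol=1e-12)
        x = x + dx
    return x

print(newton_krylov(np.array([1.0, 1.0])))  # converges to the root (1, 2)
```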


Riemannian Geometry of Grassmann Manifolds with a View on Algorithmic Computation

We give simple formulas for the canonical metric, gradient, Lie derivative, Riemannian connection, parallel translation, geodesics and distance on the Grassmann manifold of p-planes in R^n. In these formulas, p-planes are represented as the column space of n × p matrices. The Newton method on abstract Riemannian manifolds proposed by S. T. Smith is made explicit on the Grassmann manifold. Two ap...
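Among the formulas the abstract lists, the closed-form geodesic is easy to make concrete: representing a p-plane by an n × p matrix Y with orthonormal columns and a tangent direction by H with Y^T H = 0, the geodesic follows from the thin SVD of H (the well-known expression of Edelman, Arias and Smith, 1998). A hedged Python sketch with randomly generated test data:

```python
import numpy as np

def grassmann_geodesic(Y, H, t):
    """Geodesic on Gr(n, p) starting at span(Y) in tangent direction H.

    Y : (n, p) with orthonormal columns; H : (n, p) with Y.T @ H = 0.
    Uses Y(t) = Y V cos(tS) V^T + U sin(tS) V^T from the thin SVD H = U S V^T.
    """
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    cos_ts = np.diag(np.cos(t * s))
    sin_ts = np.diag(np.sin(t * s))
    return Y @ Vt.T @ cos_ts @ Vt + U @ sin_ts @ Vt

# Small check with assumed data: columns stay orthonormal along the geodesic.
rng = np.random.default_rng(0)
n, p = 6, 2
Y, _ = np.linalg.qr(rng.standard_normal((n, p)))
H = rng.standard_normal((n, p))
H -= Y @ (Y.T @ H)                        # project onto the horizontal space
Yt = grassmann_geodesic(Y, H, 0.3)
print(np.allclose(Yt.T @ Yt, np.eye(p)))  # True
```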


Second-order adjoint state methods for Full Waveform Inversion

Full Waveform Inversion (FWI) applications classically rely on efficient first-order optimization schemes, such as steepest descent or nonlinear conjugate gradient optimization. However, second-order information provided by the Hessian matrix has proven helpful in scaling the FWI problem and in speeding up the optimization. In this study, we propose an efficient matri...
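In such schemes the Hessian is rarely formed explicitly; what matters is the Hessian-vector product, which second-order adjoints deliver exactly. The sketch below conveys the idea on a toy objective, approximating the product by finite differences of gradients and feeding it to a conjugate-gradient solve; the objective is an assumption for illustration, not an FWI misfit.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

# Gradient of the toy objective f(x) = 0.5 x^T D x + 0.25 ||x||^4 (assumed).
# In FWI the gradient comes from a first-order adjoint solve.
def grad(x):
    D = np.arange(1, x.size + 1)
    return D * x + (x @ x) * x

def hess_vec(x, v, eps=1e-6):
    # Matrix-free Hessian-vector product via central differences of gradients;
    # a second-order adjoint solve yields this product without the O(eps) error.
    return (grad(x + eps * v) - grad(x - eps * v)) / (2.0 * eps)

x = np.full(4, 0.5)
Hop = LinearOperator((4, 4), matvec=lambda v: hess_vec(x, v))
step, info = cg(Hop, -grad(x))            # one truncated-Newton step
print(step, info)
```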


A Globally Convergent Conjugate Gradient Method for Minimizing Self-Concordant Functions on Riemannian Manifolds

Self-concordant functions are a special class of convex functions in Euclidean space introduced by Nesterov. They are used in interior point methods, based on Newton iterations, where they play an important role in solving certain constrained optimization problems efficiently. The concept of self-concordant functions has been defined on Riemannian manifolds by Jiang et al., and a damped Newton m...
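In the Euclidean setting, the damped Newton method for a self-concordant f scales the Newton step by 1/(1 + λ(x)), where λ(x) = (∇f(x)^T ∇²f(x)^{-1} ∇f(x))^{1/2} is the Newton decrement; this guarantees global convergence without a line search. A minimal sketch on an assumed barrier-type objective (the Riemannian version replaces the straight-line update by a geodesic step):

```python
import numpy as np

# f(x) = sum(x_i - log x_i) on the positive orthant is self-concordant,
# with minimizer x = (1, ..., 1). (Assumed example, not from the paper.)
def grad(x):
    return 1.0 - 1.0 / x

def hess(x):
    return np.diag(1.0 / x**2)

def damped_newton(x, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        g = grad(x)
        step = np.linalg.solve(hess(x), g)
        lam = np.sqrt(g @ step)            # Newton decrement lambda(x)
        if lam < tol:
            break
        x = x - step / (1.0 + lam)         # damped step; iterate stays feasible
    return x

print(damped_newton(np.array([0.3, 2.0])))  # approaches [1. 1.]
```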


Optimization Methods on Riemannian Manifolds and Their Application to Shape Space

We extend the scope of analysis for line-search optimization algorithms on (possibly infinite-dimensional) Riemannian manifolds to the convergence analysis of the BFGS quasi-Newton scheme and the Fletcher-Reeves conjugate gradient iteration. Numerical implementations for exemplary problems in shape spaces show the practical applicability of these methods.
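A concrete instance of a Riemannian Fletcher-Reeves iteration, sketched on the unit sphere rather than a shape space: minimizing the Rayleigh quotient x^T A x, whose minimum over the sphere is the smallest eigenvalue of A. The retraction, the projection-based vector transport and the random test matrix are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M + M.T                                 # assumed symmetric test matrix

def f(x):                                   # Rayleigh quotient on the unit sphere
    return x @ A @ x

def rgrad(x):                               # Riemannian gradient: project 2Ax
    g = 2.0 * A @ x
    return g - (x @ g) * x

def retract(x, v):                          # retraction: normalize x + v
    y = x + v
    return y / np.linalg.norm(y)

x = rng.standard_normal(5)
x /= np.linalg.norm(x)
g = rgrad(x)
d = -g
for _ in range(200):
    if np.linalg.norm(g) < 1e-8:
        break
    if g @ d >= 0:                          # safeguard: restart with steepest descent
        d = -g
    t = 1.0                                 # Armijo backtracking along d
    while f(retract(x, t * d)) > f(x) + 1e-4 * t * (g @ d):
        t *= 0.5
    x_new = retract(x, t * d)
    g_new = rgrad(x_new)
    beta = (g_new @ g_new) / (g @ g)        # Fletcher-Reeves coefficient
    d = -g_new + beta * (d - (x_new @ d) * x_new)  # transport old d by projection
    x, g = x_new, g_new

# The two printed values should agree: the minimum of the Rayleigh quotient
# over the sphere is the smallest eigenvalue of A.
print(f(x), np.linalg.eigvalsh(A)[0])
```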




Journal:

Volume   Issue 

Pages  -

Publication year: 2008